Computationally efficient probabilistic inference with noisy threshold models based on a CP tensor decomposition
Abstract
Conditional probability tables (CPTs) of threshold functions represent a generalization of two popular models, noisy-or and noisy-and. They constitute an alternative to these two models when those are too coarse. When using standard inference techniques, the inference complexity is exponential with respect to the number of parents of a variable. If the CPTs take a special form (in this paper, the noisy-threshold model), more efficient inference techniques can be employed. Each CPT defined for variables with a finite number of states can be viewed as a tensor (a multilinear array). Tensors can be decomposed as linear combinations of rank-one tensors, where a rank-one tensor is an outer product of vectors. Such a decomposition is referred to as the Canonical Polyadic (CP) or CANDECOMP/PARAFAC (CP) decomposition. The tensor decomposition offers a compact representation of CPTs which can be efficiently utilized in probabilistic inference. In this paper we propose a CP decomposition of tensors corresponding to CPTs of threshold functions and their noisy counterparts. We performed experiments on subnetworks of the well-known QMR-DT network generalized by replacing noisy-or by noisy-threshold models. Each generated subnetwork contained more than one hundred variables. The results of our experiments reveal that by using the suggested decomposition of CPTs we can obtain computational savings of several orders of magnitude.
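To illustrate the idea behind such decompositions, the sketch below treats the noisy-or model (the special case of the noisy-threshold mentioned above) as a tensor and reproduces it exactly as a sum of two rank-one tensors. The parameters `q` (the inhibition probabilities) and the number of parents are illustrative assumptions, not values from the paper:

```python
import numpy as np
from functools import reduce

# Illustrative inhibition probabilities q_i = 1 - p_i for k = 3 parents (assumed values)
q = np.array([0.9, 0.7, 0.6])
k = len(q)

# Build the exact noisy-or CPT as a tensor T[y, x1, ..., xk]:
#   P(Y = 0 | x) = prod_i q_i^{x_i},   P(Y = 1 | x) = 1 - P(Y = 0 | x)
T = np.zeros((2,) * (k + 1))
for idx in np.ndindex(*((2,) * k)):
    p0 = np.prod(q ** np.array(idx))
    T[(0,) + idx] = p0
    T[(1,) + idx] = 1.0 - p0

def outer(vectors):
    """Outer product of a list of vectors -> a rank-one tensor."""
    return reduce(np.multiply.outer, vectors)

# CP decomposition with two rank-one summands:
#   T = [0, 1] (x) 1 (x) ... (x) 1  +  [1, -1] (x) [1, q_1] (x) ... (x) [1, q_k]
term1 = outer([np.array([0.0, 1.0])] + [np.ones(2)] * k)
term2 = outer([np.array([1.0, -1.0])] + [np.array([1.0, qi]) for qi in q])
T_cp = term1 + term2

# The two-term CP form reproduces the CPT exactly
assert np.allclose(T, T_cp)
```

The point is that a CPT over k binary parents has 2^(k+1) entries, while the rank-one factors store only 2(k+1) numbers per summand, which is what makes inference with such a representation cheaper.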
Similar resources
Probabilistic inference with noisy-threshold models based on a CP tensor decomposition
The specification of conditional probability tables (CPTs) is a difficult task in the construction of probabilistic graphical models. Several types of canonical models have been proposed to ease that difficulty. Noisy-threshold models generalize the two most popular canonical models: the noisy-or and the noisy-and. When using the standard inference techniques the inference complexity is exponen...
Infinite Tucker Decomposition: Nonparametric Bayesian Models for Multiway Data Analysis
Tensor decomposition is a powerful computational tool for multiway data analysis. Many popular tensor decomposition approaches—such as the Tucker decomposition and CANDECOMP/PARAFAC (CP)—amount to multi-linear factorization. They are insufficient to model (i) complex interactions between data entities, (ii) various data types (e.g.missing data and binary data), and (iii) noisy observations and ...
InfTucker: t-Process based Infinite Tensor Decomposition
Tensor decomposition is a powerful tool for multiway data analysis. Many popular tensor decomposition approaches—such as the Tucker decomposition and CANDECOMP/PARAFAC (CP)—conduct multi-linear factorization. They are insufficient to model (i) complex interactions between data entities, (ii) various data types (e.g. missing data and binary data), and (iii) noisy observations and outliers. To ad...
An Approximate Tensor-Based Inference Method Applied to the Game of Minesweeper
We propose an approximate probabilistic inference method based on the CP tensor decomposition and apply it to the well-known computer game of Minesweeper. In the method we view conditional probability tables of the exactly-l-out-of-k functions as tensors and approximate them by a sum of rank-one tensors. The number of the summands is min{l + 1, k − l + 1}, which is lower than their exact symmetr...
Vectorial Dimension Reduction for Tensors Based on Bayesian Inference
Dimensionality reduction for high-order tensors is a challenging problem. In conventional approaches, higher-order tensors are "vectorized" via Tucker decomposition to obtain lower-order tensors. This destroys the inherent high-order structure or results in undesired tensors, respectively. This paper introduces a probabilistic vectorial dimensionality reduction model for tensorial data. ...
Journal:
Volume, Issue:
Pages: -
Publication year: 2012